Measure of Similarity between GMMs Based on Geometry-Aware Dimensionality Reduction

Authors

Abstract

Gaussian Mixture Models (GMMs) are used in many traditional expert systems and modern artificial intelligence tasks, such as automatic speech recognition, image recognition and retrieval, pattern recognition, speaker verification, and financial forecasting, as simple statistical representations of the underlying data. These tasks typically require high-dimensional GMM components that consume large computing resources and increase computation time. Real-time applications, on the other hand, demand computationally efficient algorithms; for this reason, various similarity measures and dimensionality reduction techniques have been examined to reduce computational complexity. In this paper, a novel similarity measure between GMMs is proposed. The measure is based on a recently presented nonlinear geometry-aware dimensionality reduction algorithm for the manifold of Symmetric Positive Definite (SPD) matrices. The reduction is applied over SPD representations of the original GMM parameters, so that local neighborhood information from the original parameter space is preserved by preserving distances to the local mean. Instead of dealing with the high-dimensional parameter space, the proposed method operates on much lower-dimensional transformed parameters, resolving the similarity between the reduced representations by calculating distances among them. The measure was tested within a texture recognition task, where superior performance in terms of the trade-off between accuracy and computational complexity was achieved in comparison with all state-of-the-art baseline measures.
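The pipeline the abstract describes (represent Gaussian parameters as SPD matrices, then compare them with a geometry-aware distance on the SPD manifold) can be sketched in broad strokes. The snippet below is an illustration only, not the paper's algorithm: it uses one known embedding of a Gaussian N(mu, Sigma) into a (d+1)x(d+1) SPD matrix together with the affine-invariant Riemannian distance, and omits the paper's dimensionality reduction step entirely.

```python
import numpy as np

def gaussian_to_spd(mu, sigma):
    """Embed a Gaussian N(mu, sigma) into a (d+1)x(d+1) SPD matrix
    (one known embedding of Gaussian parameters; the paper's exact
    construction may differ)."""
    mu = np.asarray(mu, dtype=float).reshape(-1, 1)
    sigma = np.asarray(sigma, dtype=float)
    d = mu.shape[0]
    P = np.empty((d + 1, d + 1))
    P[:d, :d] = sigma + mu @ mu.T
    P[:d, d:] = mu
    P[d:, :d] = mu.T
    P[d, d] = 1.0
    return P

def airm_distance(A, B):
    """Affine-invariant Riemannian distance between SPD matrices A, B:
    the Frobenius norm of log(A^{-1/2} B A^{-1/2})."""
    # A^{-1/2} via eigendecomposition of the symmetric matrix A.
    w, V = np.linalg.eigh(A)
    A_inv_sqrt = (V / np.sqrt(w)) @ V.T
    C = A_inv_sqrt @ B @ A_inv_sqrt
    # C is SPD, so the log-norm is sqrt(sum of squared log-eigenvalues).
    evals = np.linalg.eigh(C)[0]
    return float(np.sqrt(np.sum(np.log(evals) ** 2)))
```

Two mixture components can then be compared directly via `airm_distance(gaussian_to_spd(mu1, S1), gaussian_to_spd(mu2, S2))`; extending this to whole mixtures and adding the geometry-aware reduction step is where the paper's contribution lies.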


Similar Resources

Dimensionality Invariant Similarity Measure

This paper presents a new similarity measure to be used for general tasks including supervised learning, which is represented by the K-nearest neighbor (KNN) classifier. The proposed similarity measure is invariant to large differences in some dimensions of the feature space, and is proved mathematically to be a metric. To test its viability for different applications, the KNN u...
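To see where such a measure plugs in, here is a minimal majority-vote KNN with a pluggable distance function. This is my illustration, not code from the paper: the Euclidean default merely stands in for the dimensionality-invariant metric, whose exact form is not given in this snippet.

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3, metric=None):
    """Majority-vote k-NN prediction for a single query point x.
    Any distance function can be supplied via `metric`; a custom
    similarity measure slots in the same way as the Euclidean default."""
    if metric is None:
        metric = lambda a, b: np.linalg.norm(a - b)
    dists = np.array([metric(xt, x) for xt in X_train])
    nearest = np.argsort(dists)[:k]
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]
```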


On the geometry of similarity search: dimensionality curse and concentration of measure

We suggest that the curse of dimensionality affecting the similarity-based search in large datasets is a manifestation of the phenomenon of concentration of measure on high-dimensional structures. We prove that, under certain geometric assumptions on the query domain Ω and the dataset X, if Ω satisfies the so-called concentration property, then for most query points x the ball of radius (1 + ε)...
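The distance-concentration phenomenon this abstract refers to is easy to observe numerically. The following sketch (my illustration, not from the paper) measures the relative spread of distances from a random query to random points in the unit cube, which shrinks toward zero as the dimension grows — making "nearest" and "farthest" neighbors nearly indistinguishable.

```python
import numpy as np

rng = np.random.default_rng(0)

def relative_distance_spread(dim, n=2000):
    """Std/mean of distances from one random query point to n random
    points in the unit cube [0, 1]^dim; concentration of measure
    drives this ratio toward 0 as dim grows."""
    X = rng.random((n, dim))
    q = rng.random(dim)
    d = np.linalg.norm(X - q, axis=1)
    return float(d.std() / d.mean())
```

For example, the spread in 2 dimensions is roughly an order of magnitude larger than in 1000 dimensions.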


High-Dimensional Geometry, Curse of Dimensionality, Dimension Reduction

High-dimensional vectors are ubiquitous in algorithms and this lecture seeks to introduce some common properties of these vectors. We encounter the so-called curse of dimensionality which refers to the fact that algorithms are simply harder to design in high dimensions and often have a running time exponential in the dimension. We also show that it is possible to reduce the dimension of a datas...


Embedded Map Projection for Dimensionality Reduction-Based Similarity Search

We describe a dimensionality reduction method based on data point projection in an output space obtained by embedding the Growing Hierarchical Self Organizing Maps (GHSOM) computed from a training dataset. The dimensionality reduction is used in a similarity search framework whose aim is to efficiently retrieve similar objects on the basis of the Euclidean distance among high dimensional featu...


A distance measure between GMMs based on the unscented transform and its application to speaker recognition

This paper proposes a dissimilarity measure between two Gaussian mixture models (GMM). Computing a distance measure between two GMMs that were learned from speech segments is a key element in speaker verification, speaker segmentation and many other related applications. A natural measure between two distributions is the Kullback-Leibler divergence. However, it cannot be analytically computed i...
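As this abstract notes, the Kullback-Leibler divergence has a closed form between two individual Gaussians but not between full GMMs, where it must be approximated. The sketch below illustrates both the closed form and a plain Monte Carlo estimate; it is an illustration of the problem, not the unscented-transform method the paper itself proposes.

```python
import numpy as np

def kl_gaussians(mu0, S0, mu1, S1):
    """Closed-form KL( N(mu0, S0) || N(mu1, S1) )."""
    mu0, mu1 = np.asarray(mu0, float), np.asarray(mu1, float)
    d = mu0.size
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0) + diff @ S1_inv @ diff - d
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

def gmm_logpdf(X, w, mus, covs):
    """Log-density of a full-covariance GMM at points X (n x d)."""
    n, d = X.shape
    comps = []
    for wk, mu, S in zip(w, mus, covs):
        diff = X - mu
        maha = np.einsum('ni,ij,nj->n', diff, np.linalg.inv(S), diff)
        log_norm = -0.5 * (d * np.log(2 * np.pi) + np.log(np.linalg.det(S)))
        comps.append(np.log(wk) + log_norm - 0.5 * maha)
    return np.logaddexp.reduce(np.stack(comps), axis=0)

def mc_kl_gmm(p, q, n=5000, seed=0):
    """Monte Carlo estimate of KL(p || q) for GMMs given as
    (weights, means, covs) tuples: sample from p, average log p - log q."""
    rng = np.random.default_rng(seed)
    w, mus, covs = p
    ks = rng.choice(len(w), size=n, p=w)
    X = np.stack([rng.multivariate_normal(mus[k], covs[k]) for k in ks])
    return float(np.mean(gmm_logpdf(X, *p) - gmm_logpdf(X, *q)))
```

For single-component mixtures the Monte Carlo estimate should agree with the closed form, which provides a convenient sanity check; for speech-sized GMMs the sampling cost is exactly what motivates cheaper surrogate measures.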



Journal

Journal title: Mathematics

Year: 2022

ISSN: 2227-7390

DOI: https://doi.org/10.3390/math11010175